**** dT 0.000
* top TEST ../../../../bin/varnishtest/tests/c00090.vtc starting
**** top extmacro def pkg_version=trunk
**** top extmacro def pkg_branch=trunk
**** top extmacro def pwd=/home/linux1/VT/varnish-cache/varnish-trunk/_build/sub/bin/varnishtest
**** top extmacro def date(...)
**** top extmacro def string(...)
**** top extmacro def localhost=127.0.0.1
**** top extmacro def bad_backend=127.0.0.1:32927
**** top extmacro def listen_addr=127.0.0.1:0
**** top extmacro def bad_ip=192.0.2.255
**** top extmacro def topbuild=/home/linux1/VT/varnish-cache/varnish-trunk/_build/sub
**** top extmacro def topsrc=/home/linux1/VT/varnish-cache/varnish-trunk/_build/sub/../..
**** top macro def testdir=/home/linux1/VT/varnish-cache/varnish-trunk/_build/sub/bin/varnishtest/../../../../bin/varnishtest/tests
**** top macro def tmpdir=/home/linux1/VT/_vtest_tmp/vtc.1479599.6e79a8ec
**** top macro def vtcid=vtc.1479599.6e79a8ec
** top === varnishtest "Forcing health of backends listening at UDS"
* top VTEST Forcing health of backends listening at UDS
** top === server s1 -listen "${tmpdir}/s1.sock" -repeat 3 {
** s1 Starting server
**** s1 macro def s1_addr=0.0.0.0
**** s1 macro def s1_port=0
**** s1 macro def s1_sock=/home/linux1/VT/_vtest_tmp/vtc.1479599.6e79a8ec/s1.sock
* s1 Listen on /home/linux1/VT/_vtest_tmp/vtc.1479599.6e79a8ec/s1.sock
** top === varnish v1 -vcl {
**** dT 0.005
** s1 Started on /home/linux1/VT/_vtest_tmp/vtc.1479599.6e79a8ec/s1.sock (3 iterations)
**** dT 0.013
** v1 Launch
*** v1 CMD: cd ${pwd} && exec varnishd -d -n /home/linux1/VT/_vtest_tmp/vtc.1479599.6e79a8ec/v1 -i v1 -l 2m -p auto_restart=off -p syslog_cli_traffic=off -p thread_pool_min=10 -p debug=+vtc_mode -p vsl_mask=+Debug,+H2RxHdr,+H2RxBody -p h2_initial_window_size=1m -p h2_rx_window_low_water=64k -a '127.0.0.1:0' -M '127.0.0.1 38969' -P /home/linux1/VT/_vtest_tmp/vtc.1479599.6e79a8ec/v1/varnishd.pid -p vmod_path=/home/linux1/VT/varnish-cache/varnish-trunk/_build/sub/vmod/.libs
*** v1 CMD: cd /home/linux1/VT/varnish-cache/varnish-trunk/_build/sub/bin/varnishtest && exec varnishd -d -n /home/linux1/VT/_vtest_tmp/vtc.1479599.6e79a8ec/v1 -i v1 -l 2m -p auto_restart=off -p syslog_cli_traffic=off -p thread_pool_min=10 -p debug=+vtc_mode -p vsl_mask=+Debug,+H2RxHdr,+H2RxBody -p h2_initial_window_size=1m -p h2_rx_window_low_water=64k -a '127.0.0.1:0' -M '127.0.0.1 38969' -P /home/linux1/VT/_vtest_tmp/vtc.1479599.6e79a8ec/v1/varnishd.pid -p vmod_path=/home/linux1/VT/varnish-cache/varnish-trunk/_build/sub/vmod/.libs
*** v1 PID: 1479618
**** v1 macro def v1_pid=1479618
**** v1 macro def v1_name=/home/linux1/VT/_vtest_tmp/vtc.1479599.6e79a8ec/v1
**** dT 0.021
*** v1 debug|Info: Working directory not mounted on tmpfs partition
*** v1 debug|
**** dT 0.033
*** v1 debug|Debug: Version: varnish-trunk revision 4c8fb86287a0e9c9ebe066bc5e69b586f39bcb39
*** v1 debug|Debug: Platform: Linux,4.18.0-553.47.1.el8_10.s390x,s390x,-jlinux,-sdefault,-sdefault,-hcritbit
*** v1 debug|200 329
*** v1 debug|-----------------------------
*** v1 debug|Varnish Cache CLI 1.0
*** v1 debug|-----------------------------
*** v1 debug|Linux,4.18.0-553.47.1.el8_10.s390x,s390x,-jlinux,-sdefault,-sdefault,-hcritbit
*** v1 debug|varnish-trunk revision 4c8fb86287a0e9c9ebe066bc5e69b586f39bcb39
*** v1 debug|
*** v1 debug|Type 'help' for command list.
*** v1 debug|Type 'quit' to close CLI session.
*** v1 debug|Type 'start' to launch worker process.
*** v1 debug|
**** dT 0.135
**** v1 CLIPOLL 1 0x1 0x0 0x0
*** v1 CLI connection fd = 7
*** v1 CLI RX 107
**** v1 CLI RX|xrxgpmenzvcxzrvfitwhpvnkihdyvhlw
**** v1 CLI RX|
**** v1 CLI RX|Authentication required.
**** v1 CLI TX|auth de6a4ba5928e9f9220b02e69824ac12fbac4e413a7a2ac96981387d6e4165ef4
**** dT 0.140
*** v1 CLI RX 200
**** v1 CLI RX|-----------------------------
**** v1 CLI RX|Varnish Cache CLI 1.0
**** v1 CLI RX|-----------------------------
**** v1 CLI RX|Linux,4.18.0-553.47.1.el8_10.s390x,s390x,-jlinux,-sdefault,-sdefault,-hcritbit
**** v1 CLI RX|varnish-trunk revision 4c8fb86287a0e9c9ebe066bc5e69b586f39bcb39
**** v1 CLI RX|
**** v1 CLI RX|Type 'help' for command list.
**** v1 CLI RX|Type 'quit' to close CLI session.
**** v1 CLI RX|Type 'start' to launch worker process.
**** dT 0.141
**** v1 CLI TX|vcl.inline vcl1 << %XJEIFLH|)Xspa8P
**** v1 CLI TX|vcl 4.1;
**** v1 CLI TX|
**** v1 CLI TX|\tbackend s1 {
**** v1 CLI TX|\t\t.path = "/home/linux1/VT/_vtest_tmp/vtc.1479599.6e79a8ec/s1.sock";
**** v1 CLI TX|\t\t.probe = {
**** v1 CLI TX|\t\t\t.window = 8;
**** v1 CLI TX|\t\t\t.initial = 7;
**** v1 CLI TX|\t\t\t.threshold = 8;
**** v1 CLI TX|\t\t\t.interval = 10s;
**** v1 CLI TX|\t\t}
**** v1 CLI TX|\t}
**** v1 CLI TX|
**** v1 CLI TX|\tsub vcl_recv {
**** v1 CLI TX|\t\treturn(pass);
**** v1 CLI TX|\t}
**** v1 CLI TX|
**** v1 CLI TX|
**** v1 CLI TX|%XJEIFLH|)Xspa8P
**** dT 0.245
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 0.347
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 0.452
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 0.555
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 0.657
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 0.759
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 0.861
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 0.962
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 1.065
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 1.166
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 1.267
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 1.376
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 1.478
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 1.581
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 1.686
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 1.788
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 1.820
*** v1 CLI RX 200
**** v1 CLI RX|VCL compiled.
**** v1 CLI TX|vcl.use vcl1
**** dT 1.825
*** v1 CLI RX 200
**** v1 CLI RX|VCL 'vcl1' now active
** v1 Start
**** v1 CLI TX|start
**** dT 1.881
*** v1 debug|Debug: Child (1479631) Started
**** dT 1.888
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 1.919
*** v1 debug|Child launched OK
**** dT 1.935
*** v1 CLI RX 200
*** v1 wait-running
**** v1 CLI TX|status
*** v1 debug|Info: Child (1479631) said Child starts
**** dT 1.939
*** v1 CLI RX 200
**** v1 CLI RX|Child in state running
**** v1 CLI TX|debug.listen_address
**** dT 1.980
*** v1 CLI RX 200
**** v1 CLI RX|a0 127.0.0.1 38075
**** v1 CLI TX|debug.xid 1000
**** dT 1.989
**** v1 vsl| 0 CLI - Rd vcl.load "vcl1" vcl_vcl1.1744975426.604828/vgc.so 1auto
**** v1 vsl| 0 Backend_health - s1 Went sick -------H 7 8 8 0.000000 0.000000 ""
**** v1 vsl| 0 Backend_health - s1 Still sick -------H 7 8 8 0.000000 0.000000 ""
**** v1 vsl| 0 CLI - Wr 200 52 Loaded "vcl_vcl1.1744975426.604828/vgc.so" as "vcl1"
**** v1 vsl| 0 CLI - Rd vcl.use "vcl1"
**** v1 vsl| 0 CLI - Wr 200 0
**** v1 vsl| 0 CLI - Rd start
**** v1 vsl| 0 Debug - sockopt: Setting SO_LINGER for a0=127.0.0.1:38075
**** v1 vsl| 0 Debug - sockopt: Setting SO_KEEPALIVE for a0=127.0.0.1:38075
**** v1 vsl| 0 Debug - sockopt: Setting SO_SNDTIMEO for a0=127.0.0.1:38075
**** v1 vsl| 0 Debug - sockopt: Setting SO_RCVTIMEO for a0=127.0.0.1:38075
**** v1 vsl| 0 Debug - sockopt: Setting TCP_NODELAY for a0=127.0.0.1:38075
**** v1 vsl| 0 Debug - sockopt: Setting TCP_KEEPIDLE for a0=127.0.0.1:38075
**** v1 vsl| 0 Debug - sockopt: Setting TCP_KEEPCNT for a0=127.0.0.1:38075
**** v1 vsl| 0 Debug - sockopt: Setting TCP_KEEPINTVL for a0=127.0.0.1:38075
**** v1 vsl| 0 CLI - Wr 200 0
**** v1 vsl| 0 CLI - Rd debug.listen_address
**** v1 vsl| 0 CLI - Wr 200 19 a0 127.0.0.1 38075
**** dT 2.034
*** v1 CLI RX 200
**** v1 CLI RX|XID is 1000 chunk 1
**** v1 CLI TX|debug.listen_address
**** dT 2.085
*** v1 CLI RX 200
**** v1 CLI RX|a0 127.0.0.1 38075
** v1 Listen on 127.0.0.1 38075
**** v1 macro def v1_addr=127.0.0.1
**** v1 macro def v1_port=38075
**** v1 macro def v1_sock=127.0.0.1:38075
**** v1 macro def v1_a0_addr=127.0.0.1
**** v1 macro def v1_a0_port=38075
**** v1 macro def v1_a0_sock=127.0.0.1:38075
** top === delay 1
*** top delaying 1 second(s)
**** dT 2.090
**** v1 vsl| 0 CLI - Rd debug.xid 1000
**** v1 vsl| 0 CLI - Wr 200 19 XID is 1000 chunk 1
**** v1 vsl| 0 CLI - Rd debug.listen_address
**** v1 vsl| 0 CLI - Wr 200 19 a0 127.0.0.1 38075
**** dT 3.088
** top === varnish v1 -cliok "vcl.list"
**** v1 CLI TX|vcl.list
**** dT 3.156
*** v1 CLI RX 200
**** v1 CLI RX|active auto warm 0 vcl1
** v1 CLI 200
** top === varnish v1 -cliok "backend.list -p"
**** v1 CLI TX|backend.list -p
**** dT 3.204
*** v1 CLI RX 200
**** v1 CLI RX|Backend name Admin Probe Health Last change
**** v1 CLI RX|vcl1.s1 probe 7/8 sick Fri, 18 Apr 2025 11:23:48 GMT
**** v1 CLI RX| Current states good: 7 threshold: 8 window: 8
**** v1 CLI RX| Average response time of good probes: 0.000000
**** v1 CLI RX| Oldest ================================================== Newest
**** v1 CLI RX| ---------------------------------------------------------------U Good UNIX
**** v1 CLI RX| ---------------------------------------------------------------X Good Xmit
**** v1 CLI RX| --------------------------------------------------------HHHHHHH- Happy
**** v1 CLI RX|
** v1 CLI 200
** top === varnish v1 -cliok "backend.set_health s1 auto"
**** v1 CLI TX|backend.set_health s1 auto
**** dT 3.216
**** v1 vsl| 0 CLI - Rd vcl.list
**** v1 vsl| 0 CLI - Wr 200 32 active auto warm 0 vcl1
**** v1 vsl| 0 CLI - Rd backend.list -p
**** v1 vsl| 0 CLI - Wr 200 517 Backend name Admin Probe Health Last change vcl1.s1 probe 7/8 sick Fri, 18 Apr 2025 11:23:48 GMT Current states good: 7 threshold: 8 window: 8 Average response time of good probes: 0.000000 Oldest ============
**** dT 3.255
*** v1 CLI RX 200
** v1 CLI 200
** top === varnish v1 -cliok "backend.list -p"
**** v1 CLI TX|backend.list -p
**** dT 3.308
*** v1 CLI RX 200
**** v1 CLI RX|Backend name Admin Probe Health Last change
**** v1 CLI RX|vcl1.s1 probe 7/8 sick Fri, 18 Apr 2025 11:23:48 GMT
**** v1 CLI RX| Current states good: 7 threshold: 8 window: 8
**** v1 CLI RX| Average response time of good probes: 0.000000
**** v1 CLI RX| Oldest ================================================== Newest
**** v1 CLI RX| ---------------------------------------------------------------U Good UNIX
**** v1 CLI RX| ---------------------------------------------------------------X Good Xmit
**** v1 CLI RX| --------------------------------------------------------HHHHHHH- Happy
**** v1 CLI RX|
** v1 CLI 200
** top === client c1 {
** c1 Starting client
** c1 Waiting for client
**** dT 3.313
** c1 Started on 127.0.0.1:38075 (1 iterations)
*** c1 Connect to 127.0.0.1:38075
*** c1 connected fd 17 from 127.0.0.1 37994 to 127.0.0.1:38075
** c1 === txreq
**** c1 txreq|GET / HTTP/1.1\r
**** c1 txreq|Host: 127.0.0.1\r
**** c1 txreq|User-Agent: c1\r
**** c1 txreq|\r
** c1 === rxresp
**** dT 3.317
**** v1 vsl| 0 CLI - Rd backend.set_health s1 auto
**** v1 vsl| 0 CLI - Wr 200 0
**** v1 vsl| 0 CLI - Rd backend.list -p
**** v1 vsl| 0 CLI - Wr 200 517 Backend name Admin Probe Health Last change vcl1.s1 probe 7/8 sick Fri, 18 Apr 2025 11:23:48 GMT Current states good: 7 threshold: 8 window: 8 Average response time of good probes: 0.000000 Oldest ============
**** v1 vsl| 1000 Begin c sess 0 HTTP/1
**** v1 vsl| 1000 SessOpen c 127.0.0.1 37994 a0 127.0.0.1 38075 1744975429.776181 23
**** v1 vsl| 1000 Debug c sockopt: SO_LINGER may be inherited for a0=127.0.0.1:38075
**** v1 vsl| 1000 Debug c sockopt: SO_KEEPALIVE may be inherited for a0=127.0.0.1:38075
**** v1 vsl| 1000 Debug c sockopt: SO_SNDTIMEO may be inherited for a0=127.0.0.1:38075
**** v1 vsl| 1000 Debug c sockopt: SO_RCVTIMEO may be inherited for a0=127.0.0.1:38075
**** v1 vsl| 1000 Debug c sockopt: TCP_NODELAY may be inherited for a0=127.0.0.1:38075
**** v1 vsl| 1000 Debug c sockopt: TCP_KEEPIDLE may be inherited for a0=127.0.0.1:38075
**** v1 vsl| 1000 Debug c sockopt: TCP_KEEPCNT may be inherited for a0=127.0.0.1:38075
**** v1 vsl| 1000 Debug c sockopt: TCP_KEEPINTVL may be inherited for a0=127.0.0.1:38075
**** v1 vsl| 1000 Link c req 1001 rxreq
**** dT 3.324
**** c1 rxhdr|HTTP/1.1 503 Backend fetch failed\r
**** c1 rxhdr|Date: Fri, 18 Apr 2025 11:23:49 GMT\r
**** c1 rxhdr|Server: Varnish\r
**** c1 rxhdr|Content-Type: text/html; charset=utf-8\r
**** c1 rxhdr|Retry-After: 5\r
**** c1 rxhdr|X-Varnish: 1001\r
**** c1 rxhdr|Age: 0\r
**** c1 rxhdr|Via: 1.1 v1 (Varnish/trunk)\r
**** c1 rxhdr|Content-Length: 281\r
**** c1 rxhdr|Connection: keep-alive\r
**** c1 rxhdr|\r
**** c1 rxhdrlen = 246
**** c1 http[ 0] |HTTP/1.1
**** c1 http[ 1] |503
**** c1 http[ 2] |Backend fetch failed
**** c1 http[ 3] |Date: Fri, 18 Apr 2025 11:23:49 GMT
**** c1 http[ 4] |Server: Varnish
**** c1 http[ 5] |Content-Type: text/html; charset=utf-8
**** c1 http[ 6] |Retry-After: 5
**** c1 http[ 7] |X-Varnish: 1001
**** c1 http[ 8] |Age: 0
**** c1 http[ 9] |Via: 1.1 v1 (Varnish/trunk)
**** c1 http[10] |Content-Length: 281
**** c1 http[11] |Connection: keep-alive
**** c1 c-l|<!DOCTYPE html>
**** c1 c-l|<html>
**** c1 c-l|  <head>
**** c1 c-l|    <title>503 Backend fetch failed</title>
**** c1 c-l|  </head>
**** c1 c-l|  <body>
**** c1 c-l|    <h1>Error 503 Backend fetch failed</h1>
**** c1 c-l|    <p>Backend fetch failed</p>
**** c1 c-l|    <h3>Guru Meditation:</h3>
**** c1 c-l|    <p>XID: 1002</p>
**** c1 c-l|    <hr>
**** c1 c-l|    <p>Varnish cache server</p>
**** c1 c-l|  </body>
**** c1 c-l|</html>
**** c1 bodylen = 281
** c1 === expect resp.status == 200
---- c1 EXPECT resp.status (503) == "200" failed
**** dT 3.327
* top RESETTING after ../../../../bin/varnishtest/tests/c00090.vtc
** s1 Waiting for server (4/-1)
**** dT 3.422
**** v1 vsl| 1002 Begin b bereq 1001 pass
**** v1 vsl| 1002 VCL_use b vcl1
**** v1 vsl| 1002 Timestamp b Start: 1744975429.778824 0.000000 0.000000
**** v1 vsl| 1002 BereqMethod b GET
**** v1 vsl| 1002 BereqURL b /
**** v1 vsl| 1002 BereqProtocol b HTTP/1.1
**** v1 vsl| 1002 BereqHeader b Host: 127.0.0.1
**** v1 vsl| 1002 BereqHeader b User-Agent: c1
**** v1 vsl| 1002 BereqHeader b X-Forwarded-For: 127.0.0.1
**** v1 vsl| 1002 BereqHeader b Via: 1.1 v1 (Varnish/trunk)
**** v1 vsl| 1002 BereqHeader b X-Varnish: 1002
**** v1 vsl| 1002 VCL_call b BACKEND_FETCH
**** v1 vsl| 1002 VCL_return b fetch
**** v1 vsl| 1002 Timestamp b Fetch: 1744975429.778848 0.000024 0.000024
**** v1 vsl| 1002 FetchError b backend s1: unhealthy
**** v1 vsl| 1002 Timestamp b Beresp: 1744975429.778855 0.000030 0.000006
**** v1 vsl| 1002 Timestamp b Error: 1744975429.778856 0.000032 0.000001
**** v1 vsl| 1002 BerespProtocol b HTTP/1.1
**** v1 vsl| 1002 BerespStatus b 503
**** v1 vsl| 1002 BerespReason b Backend fetch failed
**** v1 vsl| 1002 BerespHeader b Date: Fri, 18 Apr 2025 11:23:49 GMT
**** v1 vsl| 1002 BerespHeader b Server: Varnish
**** v1 vsl| 1002 VCL_call b BACKEND_ERROR
**** v1 vsl| 1002 BerespHeader b Content-Type: text/html; charset=utf-8
**** v1 vsl| 1002 BerespHeader b Retry-After: 5
**** v1 vsl| 1002 VCL_return b deliver
**** v1 vsl| 1002 Storage b malloc Transient
**** v1 vsl| 1002 Length b 281
**** v1 vsl| 1002 BereqAcct b 0 0 0 0 0 0
**** v1 vsl| 1002 End b
**** v1 vsl| 1001 Begin c req 1000 rxreq
**** v1 vsl| 1001 Timestamp c Start: 1744975429.776228 0.000000 0.000000
**** v1 vsl| 1001 Timestamp c Req: 1744975429.776228 0.000000 0.000000
**** v1 vsl| 1001 VCL_use c vcl1
**** v1 vsl| 1001 ReqStart c 127.0.0.1 37994 a0
**** v1 vsl| 1001 ReqMethod c GET
**** v1 vsl| 1001 ReqURL c /
**** v1 vsl| 1001 ReqProtocol c HTTP/1.1
**** v1 vsl| 1001 ReqHeader c Host: 127.0.0.1
**** v1 vsl| 1001 ReqHeader c User-Agent: c1
**** v1 vsl| 1001 ReqHeader c X-Forwarded-For: 127.0.0.1
**** v1 vsl| 1001 ReqHeader c Via: 1.1 v1 (Varnish/trunk)
**** v1 vsl| 1001 VCL_call c RECV
**** v1 vsl| 1001 VCL_return c pass
**** v1 vsl| 1001 VCL_call c HASH
**** v1 vsl| 1001 VCL_return c lookup
**** v1 vsl| 1001 VCL_call c PASS
**** v1 vsl| 1001 VCL_return c fetch
**** v1 vsl| 1001 Link c bereq 1002 pass
**** v1 vsl| 1001 Timestamp c Fetch: 1744975429.779672 0.003443 0.003443
**** v1 vsl| 1001 RespProtocol c HTTP/1.1
**** v1 vsl| 1001 RespStatus c 503
**** v1 vsl| 1001 RespReason c Backend fetch failed
**** v1 vsl| 1001 RespHeader c Date: Fri, 18 Apr 2025 11:23:49 GMT
**** v1 vsl| 1001 RespHeader c Server: Varnish
**** v1 vsl| 1001 RespHeader c Content-Type: text/html; charset=utf-8
**** v1 vsl| 1001 RespHeader c Retry-After: 5
**** v1 vsl| 1001 RespHeader c X-Varnish: 1001
**** v1 vsl| 1001 RespHeader c Age: 0
**** v1 vsl| 1001 RespHeader c Via: 1.1 v1 (Varnish/trunk)
**** v1 vsl| 1001 VCL_call c DELIVER
**** v1 vsl| 1001 VCL_return c deliver
**** v1 vsl| 1001 Timestamp c Process: 1744975429.779699 0.003471 0.000027
**** v1 vsl| 1001 Filters c
**** v1 vsl| 1001 RespHeader c Content-Length: 281
**** v1 vsl| 1001 RespHeader c Connection: keep-alive
**** v1 vsl| 1001 Timestamp c Resp: 1744975429.779733 0.003505 0.000033
**** v1 vsl| 1001 ReqAcct c 51 0 51 246 281 527
**** v1 vsl| 1001 End c
**** dT 3.570
** v1 Wait
**** v1 CLI TX|panic.show
**** dT 3.621
*** v1 CLI RX 300
**** v1 CLI RX|Child has not panicked or panic has been cleared
*** v1 debug|Info: manager stopping child
*** v1 debug|Debug: Stopping Child
**** dT 3.624
**** v1 vsl| 0 CLI - EOF on CLI connection, worker stops
**** dT 3.725
*** v1 debug|Info: Child (1479631) said Child dies
*** v1 debug|Info: Child (1479631) ended
*** v1 debug|Debug: Child cleanup complete
*** v1 debug|Info: manager dies
**** v1 STDOUT EOF
**** dT 3.730
** v1 WAIT4 pid=1479618 status=0x0000 (user 0.241613 sys 0.038222)
* top TEST ../../../../bin/varnishtest/tests/c00090.vtc FAILED
# top TEST ../../../../bin/varnishtest/tests/c00090.vtc FAILED (3.731) exit=2
FAIL tests/c00090.vtc (exit status: 2)